Multisensory integration sites identified by perception of spatial wavelet filtered visual speech gesture information.

Authors

  • Daniel E Callan
  • Jeffery A Jones
  • Kevin Munhall
  • Christian Kroos
  • Akiko M Callan
  • Eric Vatikiotis-Bateson
Abstract

Perception of speech is improved when presentation of the audio signal is accompanied by concordant visual speech gesture information. This enhancement is most prevalent when the audio signal is degraded. One potential means by which the brain affords perceptual enhancement is thought to be through the integration of concordant information from multiple sensory channels in a common site of convergence, multisensory integration (MSI) sites. Some studies have identified potential sites in the superior temporal gyrus/sulcus (STG/S) that are responsive to multisensory information from the auditory speech signal and visual speech movement. One limitation of these studies is that they do not control for activity resulting from attentional modulation cued by such things as visual information signaling the onsets and offsets of the acoustic speech signal, as well as activity resulting from MSI of properties of the auditory speech signal with aspects of gross visual motion that are not specific to place of articulation information. This fMRI experiment uses spatial wavelet bandpass filtered Japanese sentences presented with background multispeaker audio noise to discern brain activity reflecting MSI induced by auditory and visual correspondence of place of articulation information that controls for activity resulting from the above-mentioned factors. The experiment consists of a low-frequency (LF) filtered condition containing gross visual motion of the lips, jaw, and head without specific place of articulation information, a midfrequency (MF) filtered condition containing place of articulation information, and an unfiltered (UF) condition. Sites of MSI selectively induced by auditory and visual correspondence of place of articulation information were determined by the presence of activity for both the MF and UF conditions relative to the LF condition. 
Based on these criteria, sites of MSI were found predominantly in the left middle temporal gyrus (MTG), and the left STG/S (including the auditory cortex). By controlling for additional factors that could also induce greater activity resulting from visual motion information, this study identifies potential MSI sites that we believe are involved with improved speech perception intelligibility.

Related articles

Brain regions differentially involved with multisensory and visual only speech gesture information

In this study, a vowel identification task controlling for intelligibility confounds was conducted using audio-visual stimuli at different signal-to-noise levels, as well as visual-only stimuli, to investigate the neural processes involved with visual gesture information in speech perception. The fMRI results suggest that visual speech gesture information may serve to facilitate speech perception u...


Movement Sonification: Motion perception, behavioral effects and functional data

The sonification of human movement data is a new approach to supporting motor perception, motor control, and learning in sports and rehabilitation. Both perceptual and motor control mechanisms benefit from additional acoustic information, performing more accurately than under unimodal conditions. But how does movement sonification work when supporting perception and action, especially when pro...


Links between Gestures and Multisensory Processing: Individual Differences Suggest a Compensation Mechanism

Speech-associated gestures represent an important communication modality. However, individual differences in the production and perception of gestures are not yet well understood. We hypothesized that the perception of multisensory action consequences might play a crucial role. Verbal communication involves continuous calibration of audio-visual information produced by the speakers. The effe...


Multisensory and sensorimotor interactions in speech perception

This research topic presents speech as a natural, well-learned, multisensory communication signal, processed by multiple mechanisms. Reflecting the general status of the field, most articles focus on audiovisual speech perception and many utilize the McGurk effect, which arises when discrepant visual and auditory speech stimuli are presented (McGurk and MacDonald, 1976). Tiippana (2014) argues ...


The Effect of Combined Sensory and Semantic Components on Audio–Visual Speech Perception in Older Adults

Previous studies have found that perception in older people benefits from multisensory over unisensory information. As normal speech recognition is affected by both the auditory input and the visual lip movements of the speaker, we investigated the efficiency of audio and visual integration in an older population by manipulating the relative reliability of the auditory and visual information in...



Journal:
  • Journal of cognitive neuroscience

Volume 16, Issue 5

Pages: -

Year of publication: 2004